The Approximate Minimization of Functionals


Similar articles

On the Approximate Minimization of Functionals*

This paper considers in general the problem of finding the minimum of a given functional f(u) over a set B by approximately minimizing a sequence of functionals f_n(u_n) over a "discretized" set B_n; theorems are given proving the convergence of the approximating points u_n in B_n to the desired point u in B. Applications are given to the Rayleigh-Ritz method, regularization, Chebyshev solution of d...
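The scheme described above can be illustrated with a toy Rayleigh-Ritz computation. The sketch below (my own minimal example, not one of the paper's applications) minimizes the quadratic functional J(u) = ½∫₀¹ u′² dx − ∫₀¹ u dx over functions vanishing at 0 and 1, by restricting to the "discretized" span of the first n sine modes; the exact minimum is −1/24, and the approximate minima J(u_n) decrease toward it as n grows.

```python
import numpy as np

def ritz_min(n):
    """Minimum of J over span{sin(k*pi*x) : k = 1..n}, computed in closed form.

    J(u) = 0.5*int u'^2 dx - int u dx on (0,1) with u(0) = u(1) = 0.
    In this basis J decouples mode by mode, so each optimal coefficient
    is c_k = 2*b_k/(k*pi)^2 where b_k = int_0^1 sin(k*pi*x) dx.
    """
    k = np.arange(1, n + 1)
    b = (1 - (-1.0) ** k) / (k * np.pi)      # b_k; zero for even k
    c = 2 * b / (k * np.pi) ** 2             # per-mode minimizers
    return float(np.sum((k * np.pi) ** 2 * c ** 2 / 4 - c * b))

for n in (1, 3, 9, 27):
    print(n, ritz_min(n))
# the values decrease monotonically toward the exact minimum -1/24
```

Each added mode can only lower the minimum (the per-mode contribution is −b_k²/(kπ)² ≤ 0), which is the monotone-convergence behavior the paper's theorems address in far greater generality.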


Almost multiplicative linear functionals and approximate spectrum

We define a new type of spectrum, called δ-approximate spectrum, of an element a in a complex unital Banach algebra A and show that the δ-approximate spectrum σ_δ (a) of a is compact. The relation between the δ-approximate spectrum and the usual spectrum is investigated. Also an analogue of the classical Gleason-Kahane-Zelazko theorem is established: For each ε>0, there is δ>0 such that if ϕ is...


Minimization of entropy functionals

Entropy functionals (i.e. convex integral functionals) and extensions of these functionals are minimized on convex sets. This paper is aimed at reducing as much as possible the assumptions on the constraint set. Dual equalities and characterizations of the minimizers are obtained with weak constraint qualifications.



Minimization of Error Functionals over Perceptron Networks

Supervised learning of perceptron networks is investigated as an optimization problem. It is shown that both the theoretical and the empirical error functionals achieve minima over sets of functions computable by networks with a given number n of perceptrons. Upper bounds on rates of convergence of these minima with n increasing are derived. The bounds depend on a certain regularity of training...



Journal

Journal title: Mathematics of Computation

Year: 1972

ISSN: 0025-5718

DOI: 10.2307/2005894